Conversation

@Jianhong-Zhang
This PR depends on #1873.

@Jianhong-Zhang Jianhong-Zhang force-pushed the ovis-padding branch 5 times, most recently from 8201c51 to 2fabbb1 Compare September 23, 2025 21:43
@yeonsily yeonsily mentioned this pull request Oct 1, 2025
yeonsily added a commit that referenced this pull request Oct 1, 2025
From #1940

---------

Co-authored-by: Christopher Manteuffel <[email protected]>
Co-authored-by: Jianhong-Zhang <[email protected]>
SupreetSinghPalne pushed a commit that referenced this pull request Oct 9, 2025
From #1940

---------

Co-authored-by: Christopher Manteuffel <[email protected]>
Co-authored-by: Jianhong-Zhang <[email protected]>
SupreetSinghPalne pushed a commit that referenced this pull request Oct 16, 2025
From #1940

---------

Co-authored-by: Christopher Manteuffel <[email protected]>
Co-authored-by: Jianhong-Zhang <[email protected]>
@PatrykWo
Starting from v1.23.0, the vLLM fork will reach end-of-life (EOL) and will be deprecated in v1.24.0, remaining functional only for legacy use cases until then. At the same time, the vllm-gaudi plugin will be production-ready as of v1.23.0. Contributions to HabanaAI:habana_main are closed.

@PatrykWo PatrykWo closed this Oct 24, 2025

3 participants